Knowledge extraction method for follow-up data based on multi-term distillation network
WEI Chunwu, ZHAO Juanjuan, TANG Xiaoxian, QIANG Yan
Journal of Computer Applications 2021, 41 (10): 2871-2878. DOI: 10.11772/j.issn.1001-9081.2020122059
Abstract
As medical follow-up work becomes increasingly valued, the task of obtaining information relevant to follow-up guidance through medical image analysis has grown in importance. However, most deep learning-based methods are not well suited to such tasks. To address this problem, a Multi-term Knowledge Distillation (MKD) model was proposed. Firstly, exploiting the advantage of knowledge distillation in model transfer, the classification task with long-term follow-up information was converted into a model transfer task based on domain knowledge. Then, the follow-up knowledge contained in the long-term medical images was fully utilized to realize the long-term classification of lung nodules. At the same time, to address the year-to-year imbalance of the data collected during the follow-up process, a meta-learning-based normalization method was proposed, which effectively improves the training accuracy of the model in the semi-supervised setting. Experimental results on the NLST dataset show that the proposed MKD model achieves better classification accuracy on the task of long-term lung nodule classification than deep learning classification models such as GoogLeNet. When the amount of unbalanced long-term data reaches 800 cases, the MKD model enhanced by the meta-learning method improves accuracy by up to 7 percentage points over existing state-of-the-art models.
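The abstract does not give the concrete form of the MKD objective, but the core transfer mechanism it builds on, knowledge distillation, is standard: the student is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that generic distillation loss (function names, the temperature value, and the NumPy formulation are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = logits / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across
    # temperatures (the standard Hinton-style distillation term).
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return T * T * np.sum(p * (np.log(p) - np.log(q)))
```

In a multi-term setting like the one described, a teacher trained on images from an earlier follow-up year could supply the soft targets `p`, so the student absorbs longitudinal knowledge it cannot learn from its own (possibly imbalanced) year of data alone.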